Out-of-bag (OOB) error, also called out-of-bag estimate, is a method of measuring the prediction error of random forests, boosted decision trees, and other machine learning models that use bootstrap aggregating (bagging) to sub-sample the data used for training. The OOB error is the mean prediction error on each training sample, computed using only the base learners whose bootstrap sample did not contain that sample. Subsampling allows one to define an out-of-bag estimate of the prediction performance improvement by evaluating predictions on those observations which were not used in building the next base learner. Out-of-bag estimates help avoid the need for an independent validation dataset, but they often underestimate the actual performance improvement and the optimal number of iterations.〔Ridgeway, Greg (2007). (Generalized Boosted Models: A guide to the gbm package. )〕

== See also ==
*Boosting (meta-algorithm)
*Bootstrapping (statistics)
*Cross-validation (statistics)
*Random forest
*Random subspace method (attribute bagging)
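The OOB calculation described above can be sketched in plain Python. This is an illustrative sketch only: the toy data, the deliberately trivial slope-fitting "base learner" (standing in for a decision tree), and all variable names are assumptions of this example, not part of any particular library.

```python
import random

random.seed(0)

# Toy 1-D regression data (hypothetical): y = 2*x, noise omitted for clarity.
X = list(range(20))
y = [2 * x for x in X]

n = len(X)
B = 50  # number of bootstrap replicates (base learners)

# Each "base learner" here is deliberately trivial: it fits a single slope
# from its bootstrap sample. A real random forest would fit a tree instead.
models = []   # fitted slope per replicate
in_bag = []   # set of training indices seen by each replicate
for _ in range(B):
    idx = [random.randrange(n) for _ in range(n)]  # sample with replacement
    slope = sum(y[i] for i in idx) / max(sum(X[i] for i in idx), 1)
    models.append(slope)
    in_bag.append(set(idx))

# OOB error: for each training sample, average the predictions of only
# those models whose bootstrap sample did NOT contain it, then take the
# mean squared error over all samples.
errors = []
for i in range(n):
    oob_preds = [m * X[i] for m, bag in zip(models, in_bag) if i not in bag]
    if oob_preds:  # each model is OOB for a given sample with prob. ~e^-1 ≈ 37%
        pred = sum(oob_preds) / len(oob_preds)
        errors.append((pred - y[i]) ** 2)

oob_mse = sum(errors) / len(errors)
print(oob_mse)
```

Because roughly 37% of the replicates exclude any given sample, each sample typically has many out-of-bag predictors to average over, which is what lets OOB error substitute for a held-out validation set.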